Search for Relevance in Ranking
by Mike Banks Valentine

Search engines used to rely on webmasters to provide an
honest assessment of web site content by simply accepting
exactly what was placed in the meta tags of each site.

The engines ranked the sites in the beginning by looking
almost entirely at those tags and serving results based
on whether search words and phrases were contained in the
meta tags. That was back when everyone publishing web
pages was as sweet, naive and honest as a twelve-year-old
Boy Scout.

Ranking algorithms began getting immensely more complex as
webmasters started manipulating search results either by
lying about the content of the site in those meta tags, or
by stuffing those tags with keyword phrases. Meanwhile,
search engines started looking for formulas that could
deliver reliable and relevant results without relying on
honesty in meta tags.

Google rose to prominence in a very short time by placing
far less emphasis on meta tags and far more emphasis on
popularity, based on inbound links from other web sites.

I recently wrote an article entitled, "Reciprocal Linking
is Dead!" after discovering a dishonest manipulation of
that technique to increase affiliate sales through URL
cloaking and redirected link schemes.

http://searchengineoptimism.com/reciprocal-linking-dead.html

Google's PageRank algorithm, combined with the speed of
its results pages, made it hugely popular with searchers,
who seemed to find those results more relevant to their
searches. Google continues adjusting its algorithms to
include site themes, interior linking structure, body
text keyword density, frequency of updates and more.

Search has seen dramatic change over the past five years:
an expansion of players in the late-nineties internet
boom, followed by implosions, mergers, death & destruction.
A brief and humorous article offers a recap of search
engine births, marriages and affairs over this period.

http://searchengineoptimism.com/Yahoo_acquire_Overture.html

Recently, as search providers consolidated from the
hundreds of search engines of just a few years ago to
fewer than a dozen major players, optimistic new entrants
have emerged seeking a piece of the search pie. Newest
on the block is the optimistically open-source Nutch.com,
which promises to publish its search algorithms openly
while others continue to hide their proprietary formulae
for relevant search results.

Meanwhile there are ongoing attempts by some engines to
improve algorithms to reduce webmaster manipulation,
increase relevancy and produce ever better results.

This week, the small but very aggressive search engine
http://www.ExactSeek.com announced a partnership with
Alexa.com to adjust its ranking relevance using the
popularity index of Alexa site rank. This move will,
no doubt, anger many smaller webmasters whose Alexa
rankings are weaker than those of larger competitors.
Smaller localized businesses that compete against
national chain retailers will likely slip off the radar
under this scheme. Substantial, quality content is what
tips the scales back in favor of smaller content sites.
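ExactSeek's actual formula is proprietary, so as a rough
sketch only: a search engine blending keyword relevance
with an Alexa-style popularity rank might mix the two
scores something like this. Every name, weight and number
below is hypothetical, not ExactSeek's real method.

```python
# Hypothetical sketch of blending keyword relevance with an
# Alexa-style popularity rank. NOT ExactSeek's actual formula.

def blended_score(relevance, alexa_rank, popularity_weight=0.3):
    """relevance: 0.0-1.0 keyword-match score for the query.
    alexa_rank: lower is more popular (1 = most-visited site).
    Converts the rank to a 0-1 popularity score, then mixes."""
    # Assumed rescaling: rank 1 maps near 1.0, huge ranks near 0.
    popularity = 1.0 / (1.0 + alexa_rank / 100_000.0)
    return (1 - popularity_weight) * relevance \
        + popularity_weight * popularity

# A national chain (rank 500) vs. a small local site (rank 900,000)
chain = blended_score(relevance=0.70, alexa_rank=500)
local = blended_score(relevance=0.90, alexa_rank=900_000)
print(chain, local)
```

Even though the small site matches the query better, its
blended score erodes as the popularity weight grows — the
very "slip off the radar" effect described above.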

But is this so different from the popularity index Google
uses, determined by inbound links? Will web sites in
ExactSeek's results with thousands of hard-won links
disappear from the results pages when the Alexa factor is
applied to ExactSeek's algorithm?

I did a brief and unscientific test across the biggest
search engines by searching for my own site at each one.
One search phrase I've targeted is "Small business
Internet Tutorial" and the results are enlightening.

MSN search returns my site, WebSite101.com as #1 result.
Google, I'm proud to say, returns exactly the same #1.
AlltheWeb.com gives WebSite101 position #6, still good.
Teoma.com returns a results page with WebSite101 at #1.

The big boys of search, comparing this well-established
and fairly popular site, all seem to produce similar
results while using differing algorithms to reach that
conclusion.

ExactSeek currently ranks WebSite101 at position #2.

Alexa ranking is based on visits to web sites by surfers
using the Alexa toolbar plug-in for Internet Explorer.
Their ranking formula is not intuitive and a bit backward
to understand: I had assumed that bigger numbers meant
more pageviews. WebSite101 had recently been ranked at
about 50,000 in Alexa results, and I worried when I
discovered that it had changed to 47,000 lately.

It turns out that this is actually a better ranking, as
I discovered after visiting the Alexa site to research
this article. The lower the number, the closer you move
to #1 in the Alexa rankings. That vaunted #1 position is
held by Yahoo!, followed by Microsoft.com and then, not
surprisingly, by Google.

http://www.alexa.com/site/ds/top_500
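In other words, Alexa ranks sort ascending: #1 is best,
and a drop in the number is an improvement. A quick
sketch (the mapping of sites to numbers below simply
mirrors the figures mentioned in this article):

```python
# Illustrative only: Alexa-style ranks sort ASCENDING —
# the lower the number, the more popular the site.
sites = {
    "Yahoo!": 1,               # top of the Alexa 500 list
    "Microsoft.com": 2,
    "Google": 3,
    "WebSite101.com": 47_000,  # the article's recent figure
}

# Sorting by rank ascending puts the most popular site first.
by_popularity = sorted(sites, key=sites.get)
print(by_popularity)

# So moving from 50,000 to 47,000 is a gain, not a loss.
```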

The controversy likely to be generated by incorporating
Alexa rankings into ExactSeek results is that a smaller
search engine would be basing its results on popularity
among a relatively small group of Alexa toolbar users,
who visit Microsoft more often than they might visit
WebSite101. The Alexa toolbar has been downloaded 10
million times, according to their website, and they
estimate that at least 1 million users with the toolbar
are surfing at any one time.

For a primer on all toolbars, visit Search Engine Watch

http://searchenginewatch.com/links/article.php/2156381

I'd like to encourage everyone to visit WebSite101, click
the "Install Alexa Toolbar" banner at the bottom of my
home page, and help move WebSite101 up to position 45,000
by visiting me regularly, which will increase my rank at
ExactSeek from #2 to #1. ;-)

Webmasters will inevitably work to increase their site
rankings by any means they can, but site popularity is
less susceptible to the machinations of aggressive
webmastering than, say, keyword stuffing, reciprocal
link farms and similar unethical techniques that could
lead to being banned by the search engines. While web
site popularity is susceptible to traffic exchange
schemes, it tends to level off after a period of time as
traffic peaks and declines again. Alexa aggregates data
across three-month periods, which negates brief traffic
spikes.

Time will tell whether the ExactSeek ranking scheme will
be welcomed by searchers, but the first impression is
that Alexa will favor the popularity of Goliath over
David (smaller sites). ExactSeek could incorporate Alexa
rank while filtering out the Goliath factor in its
algorithms. Getting the mix right will determine
ExactSeek's popularity among webmasters.

Searchers will ultimately decide if they like ExactSeek
results.


-------------------------------------------------------
Mike Banks Valentine is a Search Engine Optimization
Specialist practicing ethical SEO for Online businesses
http://SEOptimism.com
Take our Search Engine Quiz to test your Skills Level
http://SearchEngineOptimism.com/search_engine_quiz.html
-------------------------------------------------------